314 research outputs found

    Statistical and Computational Tradeoff in Genetic Algorithm-Based Estimation

    Full text link
    When a Genetic Algorithm (GA), or more generally a stochastic algorithm, is employed in a statistical problem, the result is affected both by sampling variability, arising because only a sample is observed, and by variability due to the stochastic elements of the algorithm. This topic fits naturally into the framework of the statistical and computational tradeoff, a crucial question in recent problems, where statisticians must carefully balance the statistical and computational parts of the analysis under resource or time constraints. In the present work we analyze estimation problems tackled by GAs, for which the variability of estimates can be decomposed into these two sources, under constraints in the form of cost functions related to both data acquisition and algorithm runtime. Simulation studies are presented to discuss the statistical and computational tradeoff question. Comment: 17 pages, 5 figures
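    The decomposition described above can be illustrated with a toy sketch (not the paper's actual method): a simple random-search optimizer stands in for the GA, and the two variance components are estimated by repeating the algorithm on one fixed sample versus across resampled data. All function names and settings here are illustrative assumptions.

```python
import random
import statistics

def stochastic_estimate(sample, n_iter=300, seed=None):
    # Toy stochastic optimizer standing in for a GA: searches for the value
    # minimizing squared error, whose true optimum is the sample mean.
    rng = random.Random(seed)
    lo, hi = min(sample), max(sample)
    best, best_loss = lo, float("inf")
    for _ in range(n_iter):
        cand = rng.uniform(lo, hi)
        loss = sum((x - cand) ** 2 for x in sample)
        if loss < best_loss:
            best, best_loss = cand, loss
    return best

def variance_decomposition(population, n=50, n_samples=30, n_runs=30, seed=0):
    rng = random.Random(seed)
    # Algorithmic variability: rerun the stochastic algorithm on ONE fixed sample.
    fixed = rng.sample(population, n)
    algo_runs = [stochastic_estimate(fixed, seed=rng.randrange(10**9))
                 for _ in range(n_runs)]
    var_algo = statistics.pvariance(algo_runs)
    # Sampling variability: average the algorithm's output over a few runs,
    # then vary the sample itself.
    sample_level = []
    for _ in range(n_samples):
        s = rng.sample(population, n)
        ests = [stochastic_estimate(s, seed=rng.randrange(10**9)) for _ in range(5)]
        sample_level.append(statistics.fmean(ests))
    var_sampling = statistics.pvariance(sample_level)
    return var_sampling, var_algo
```

    Under a cost constraint, the two knobs are the sample size n (data-acquisition cost) and n_iter or n_runs (runtime cost), which is the tradeoff the abstract refers to.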

    A generalization of periodic autoregressive models for seasonal time series

    Get PDF
    Many nonstationary time series exhibit changes in the trend and seasonality structure, which may be modeled by splitting the time axis into different regimes. We propose multi-regime models where, inside each regime, the trend is linear and seasonality is explained by a Periodic Autoregressive model. In addition, to achieve parsimony, we allow season grouping, i.e. a season may consist of one, two, or more consecutive observations. Since the set of possible solutions is very large, the choice of the number of regimes, the change times, and the order and structure of the Autoregressive models is obtained by means of a Genetic Algorithm, and the evaluation of each candidate solution is left to an identification criterion such as AIC, BIC or MDL. The comparison and performance of the proposed method are illustrated by a real data analysis. The results suggest that the proposed procedure is useful for analyzing complex phenomena with structural breaks, changes in trend and evolving seasonality
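    The identification strategy above (a GA searching over change-point configurations, scored by an information criterion) can be sketched in miniature. This is a simplification under stated assumptions: each regime is reduced to a constant mean rather than a trend-plus-PAR model, the criterion is a Gaussian BIC, and all names and GA settings are illustrative.

```python
import math
import random
import statistics

def bic_for_segmentation(series, cps):
    # cps: sorted change points splitting the series into regimes;
    # each regime is modeled by its mean, with a pooled residual variance.
    bounds = [0] + list(cps) + [len(series)]
    rss, n_regimes = 0.0, 0
    for a, b in zip(bounds, bounds[1:]):
        seg = series[a:b]
        m = statistics.fmean(seg)
        rss += sum((x - m) ** 2 for x in seg)
        n_regimes += 1
    n = len(series)
    sigma2 = max(rss / n, 1e-12)          # guard against a perfect fit
    return n * math.log(sigma2) + n_regimes * math.log(n)

def ga_changepoints(series, max_cps=3, pop_size=30, gens=40, seed=0):
    rng = random.Random(seed)
    n = len(series)
    def random_solution():
        k = rng.randint(0, max_cps)
        return tuple(sorted(rng.sample(range(5, n - 5), k)))
    pop = [random_solution() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: bic_for_segmentation(series, s))
        survivors = pop[: pop_size // 2]   # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            s = set(rng.choice(survivors))
            op = rng.random()
            if op < 0.4 and s:                       # drop a change point
                s.discard(rng.choice(sorted(s)))
            elif op < 0.8 and len(s) < max_cps:      # add one
                s.add(rng.randrange(5, n - 5))
            elif s:                                  # jitter one
                c = rng.choice(sorted(s)); s.discard(c)
                s.add(min(n - 6, max(5, c + rng.randint(-3, 3))))
            children.append(tuple(sorted(s)))
        pop = survivors + children
    return min(pop, key=lambda s: bic_for_segmentation(series, s))
```

    The criterion does the model selection: adding a change point only pays off if the drop in residual variance outweighs the log(n) penalty per regime.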

    SMEs finance and bankruptcies: The role of credit guarantee schemes in the UK

    Get PDF
    In view of the debate on the future of credit guarantees worldwide, it is interesting to analyze the UK framework of guarantee schemes in favour of SMEs, where there are no private guarantee providers and a substantial public monopoly exists. In particular, this paper shows that among the countries examined by the OECD, the United Kingdom is the country where credit guarantees are least widespread. However, the trend in bankruptcies recorded in recent years by British firms is better than the median of the other countries considered. Results from the regression analysis show that in the years since the EFG was introduced, this kind of government loan guarantee scheme for SMEs may have played only a minor role, compared to macroeconomic indicators such as GDP, in dealing with SME bankruptcies

    Identification of multiregime periodic autoregressive models by genetic algorithms

    Get PDF
    This paper develops a procedure for identifying multiregime Periodic AutoRegressive (PAR) models. In each regime a possibly different PAR model is built, where changes can be due to the seasonal means, the autocorrelation structure or the variances. The number and locations of changepoints which subdivide the time span are detected by means of Genetic Algorithms (GAs), which optimize an identification criterion. The method is evaluated by means of simulation studies, and is then employed to analyze shrimp fishery data

    Chaining of Maximal Exact Matches in Graphs

    Full text link
    We study the problem of finding maximal exact matches (MEMs) between a query string Q and a labeled directed acyclic graph (DAG) G = (V, E, ℓ), and subsequently co-linearly chaining these matches. We show that it suffices to compute MEMs between node labels and Q (node MEMs) to encode full MEMs. Node MEMs can be computed in linear time, and we show how to co-linearly chain them to solve the Longest Common Subsequence (LCS) problem between Q and G. Our chaining algorithm is the first to consider a symmetric formulation of the chaining problem in graphs and runs in O(k^2 |V| + |E| + kN log N) time, where k is the width (minimum number of paths covering the nodes) of G, and N is the number of node MEMs. We then consider the problem of finding MEMs when the input graph is an indexable elastic founder graph (a subclass of labeled DAGs studied by Equi et al., Algorithmica 2022). For arbitrary input graphs, the problem cannot be solved in truly sub-quadratic time under SETH (Equi et al., ICALP 2019). We show that we can report all MEMs between Q and an indexable elastic founder graph in O(nH^2 + m + M_κ) time, where n is the total length of node labels, H is the maximum number of nodes in a block of the graph, m = |Q|, and M_κ is the number of MEMs of length at least κ. The results extend to the indexing problem, where the graph is preprocessed and a set of queries is processed as a batch. Comment: 19 pages, 1 figure
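    The MEM definition itself can be illustrated on plain strings, setting the graph machinery aside. The naive sketch below (an illustration, not the paper's algorithm) reports a triple (i, j, l) when q[i : i+l] equals t[j : j+l] and the match can be extended neither to the left nor to the right; the threshold parameter plays the role of κ.

```python
def maximal_exact_matches(q, t, kappa=1):
    # Naive enumeration of maximal exact matches between two strings.
    # A MEM (i, j, l): q[i:i+l] == t[j:j+l], not left- or right-extendable,
    # with length l >= kappa.
    mems = []
    for i in range(len(q)):
        for j in range(len(t)):
            if q[i] != t[j]:
                continue
            # Left-maximality: the characters just before must differ
            # (or a string boundary must be reached).
            if i > 0 and j > 0 and q[i - 1] == t[j - 1]:
                continue
            # Extend right as far as possible: right-maximality follows
            # because the loop stops at a mismatch or boundary.
            l = 0
            while i + l < len(q) and j + l < len(t) and q[i + l] == t[j + l]:
                l += 1
            if l >= kappa:
                mems.append((i, j, l))
    return mems
```

    On node labels of a DAG, the same notion is applied label-by-label (the "node MEMs" of the abstract), and the chaining step then assembles these local matches co-linearly.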

    Precision Electroweak Data and the Mixed Radion-Higgs Sector of Warped Extra Dimensions

    Full text link
    We derive the Lagrangian and Feynman rules up to bilinear scalar fields for the mixed Higgs-radion eigenstates interacting with Standard Model particles confined to a 3-brane in Randall-Sundrum warped geometry. We use the results to compute precision electroweak observables and compare theory predictions with experiment. We characterize the interesting regions of parameter space that simultaneously enable a very heavy Higgs mass and a very heavy radion mass, both masses being well above the putative Higgs boson mass limit in the Standard Model derived from the constraints of precision electroweak observables. For parameters consistent with the precision constraints the Higgs boson physical eigenstate is typically detectable, but its properties may be difficult to study at the Large Hadron Collider. In contrast, masses and couplings are allowed for the physical radion eigenstate that make it unobservable at the LHC. A Linear Collider will significantly improve our ability to study the Higgs eigenstate, and will typically allow detection of the radion eigenstate if it is within the machine's kinematical reach. Comment: 14 pages, 5 figures; revisions: typo correction for Feynman rules and 1 reference added

    Contributions on evolutionary computation for statistical inference

    Get PDF
    Evolutionary Computation (EC) techniques were introduced in the 1960s for dealing with complex situations. One example is an optimization problem that has no analytical solution or is computationally intractable; in many such cases these methods, named Evolutionary Algorithms (EAs), have been successfully implemented. In statistics there are many situations where complex problems arise, in particular concerning optimization. A general example is when the statistician needs to select, from a prohibitively large discrete set, just one element, which could be a model, a partition, an experiment, or the like: this is the case in model selection, cluster analysis or design of experiments. In other situations there may be an intractable function of the data, such as a likelihood, which needs to be maximized, as happens in model parameter estimation. These kinds of problems are naturally well suited to EAs, and in the last 20 years a large number of papers have been concerned with applications of EAs to statistical issues. The present dissertation is set in this strand of the literature, as it reports several implementations of EAs in statistics, while being mainly focused on statistical inference problems. Original results are proposed, as well as overviews and surveys on several topics. EAs are employed and analyzed from various statistical points of view, showing and confirming their efficiency and flexibility. The first proposal is devoted to parametric estimation problems. When EAs are employed in such analyses, a novel form of variability related to their stochastic elements is introduced. We analyze both the variability due to sampling, associated with the selected estimator, and the variability due to the EA. This analysis is set in the framework of the statistical and computational tradeoff, crucial in modern problems, by introducing cost functions related to both data acquisition and EA iterations. 
The proposed method is illustrated by means of model building examples. The subsequent chapter is concerned with EAs employed in Markov Chain Monte Carlo (MCMC) sampling. When sampling from a multimodal or highly correlated distribution, a possible strategy is to run several chains in parallel in order to improve their mixing. If these chains are allowed to interact with each other, many analogies with EC techniques can be observed, and this has led to research in many fields. The chapter reviews various methods found in the literature that combine EC techniques and MCMC sampling, in order to identify specific and common procedures and to unify them in an EC framework. In the last proposal we present a complex time series model and an identification procedure based on Genetic Algorithms (GAs). The model is capable of dealing with seasonality, via Periodic AutoRegressive (PAR) modelling, and with structural changes in time, leading to a nonstationary structure. Since the model involves a very large number of parameters and possible change points, GAs are appropriate for identifying it. The effectiveness of the procedure is shown on both simulated data and real examples, the latter referring to river flow data in hydrology. The thesis concludes with some final remarks, including directions for future work


    Block-Based Development of Mobile Learning Experiences for the Internet of Things

    Get PDF
    The Internet of Things enables experts of given domains to create smart user experiences for interacting with the environment. However, developing such experiences requires strong programming skills, which are challenging for non-technical users to acquire. This paper presents several extensions to the block-based programming language used in App Inventor to make the creation of mobile apps for smart learning experiences less challenging. Such apps are used to process and graphically represent data streams from sensors by applying map-reduce operations. A workshop with students without previous experience in Internet of Things (IoT) or mobile app programming was conducted to evaluate the proposal. As a result, students were able to create small IoT apps that ingest, process and visually represent data in a simpler form than by using App Inventor's standard features. In addition, an experimental study was carried out in a mobile app development course with academics of diverse disciplines. Results showed it was faster and easier for novice programmers to develop the proposed app using the new stream processing blocks. Spanish National Research Agency (AEI) - ERDF funds
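    The map-reduce pipeline over sensor streams that such blocks expose can be sketched in ordinary Python (a hypothetical analogue of the block semantics, not App Inventor's actual implementation; the readings and conversion are invented for illustration):

```python
from statistics import fmean

def process_stream(readings, map_fn, reduce_fn, window=5):
    # Mimics the block pipeline: map each raw reading, then reduce
    # consecutive windows of the mapped stream to one summary value each.
    mapped = [map_fn(r) for r in readings]
    return [reduce_fn(mapped[i:i + window])
            for i in range(0, len(mapped), window)]

# Hypothetical sensor data: temperature readings in Celsius, mapped to
# Fahrenheit and reduced to per-window averages for plotting.
raw = [20.0, 21.0, 19.5, 22.0, 20.5, 23.0, 24.0, 22.5, 23.5, 24.5]
summary = process_stream(raw,
                         map_fn=lambda c: c * 9 / 5 + 32,
                         reduce_fn=fmean)
```

    In the block language, map_fn and reduce_fn correspond to user-assembled blocks, so domain experts compose the pipeline without writing code like the above.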